Riemannian Optimization for Non-centered Mixture of Scaled Gaussian Distributions

Authors

Abstract

This paper studies the statistical model of the non-centered mixture of scaled Gaussian distributions (NC-MSG). Using the Fisher-Rao information geometry associated with this distribution, we derive a Riemannian gradient descent algorithm. This algorithm is leveraged for two minimization problems. The first is the minimization of a regularized negative log-likelihood (NLL), which makes a trade-off between a white Gaussian distribution and the NC-MSG. Conditions on the regularization are given so that the existence of a minimum to this problem is guaranteed without assumptions on the samples. Then, the Kullback-Leibler (KL) divergence between NC-MSGs is derived, which enables us to define a second minimization problem: the computation of centers of mass of several NC-MSGs. Numerical experiments show the good performance and speed of the Riemannian gradient descent on these problems. Finally, a nearest centroid classifier is implemented, leveraging the KL divergence and its associated center of mass. Applied to the large-scale dataset Breizhcrops, this classifier shows good accuracies as well as robustness to rigid transformations of the test set.
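The abstract centers on a Riemannian gradient descent derived from the Fisher-Rao geometry of the NC-MSG. As a rough illustration of the general scheme only, not the paper's NC-MSG geometry or step-size rule, the sketch below runs Riemannian gradient descent on the unit sphere, where the tangent-space projection and retraction have simple closed forms; every function name and parameter value here is illustrative.

```python
import numpy as np

def riemannian_gradient_descent(x0, egrad, proj, retract, step=0.01, n_iter=2000):
    """Generic Riemannian gradient descent: project the Euclidean gradient onto the
    tangent space, take a step, and retract back onto the manifold."""
    x = x0
    for _ in range(n_iter):
        rgrad = proj(x, egrad(x))       # Riemannian gradient at the current iterate
        x = retract(x, -step * rgrad)   # retraction keeps the iterate on the manifold
    return x

# Toy usage on the unit sphere: minimizing x^T A x over unit vectors recovers an
# eigenvector associated with the smallest eigenvalue of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = A @ A.T                                            # symmetric positive semi-definite
x0 = rng.standard_normal(5)
x0 /= np.linalg.norm(x0)
x_min = riemannian_gradient_descent(
    x0,
    egrad=lambda x: 2 * A @ x,                         # Euclidean gradient of x^T A x
    proj=lambda x, g: g - (x @ g) * x,                 # tangent projection on the sphere
    retract=lambda x, v: (x + v) / np.linalg.norm(x + v),
)
```

In the paper, the sphere is replaced by the NC-MSG parameter manifold and the metric by the Fisher-Rao information metric, which determines both the Riemannian gradient and the retraction.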


Similar Resources

Sigma Point Transformation for Gaussian Mixture Distributions

This paper describes the development of an approximate method for propagating uncertainty through stochastic dynamical systems using a quadrature rule integration based method. The development of quadrature rules for Gaussian mixture distributions is discussed. A numerical solution to this problem is considered that uses a Gram-Schmidt process. Simulation results are presented where the quadrat...
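For context, a common way to push a Gaussian mixture through a nonlinearity is to apply a sigma-point (unscented) rule to each component separately. The sketch below shows that generic component-wise scheme, not the Gram-Schmidt quadrature construction this abstract refers to; function names and the kappa parameter are illustrative assumptions.

```python
import numpy as np

def unscented_points(mean, cov, kappa=1.0):
    """Standard unscented sigma points and weights for a single Gaussian component."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)          # columns span the scaled sqrt of cov
    pts = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def propagate_gmm(weights, means, covs, f):
    """Propagate each mixture component through the nonlinearity f via sigma points,
    returning (component weight, transformed mean, transformed covariance) triples."""
    out = []
    for w_c, m, P in zip(weights, means, covs):
        pts, w = unscented_points(m, P)
        y = np.array([f(p) for p in pts])
        my = w @ y
        Py = sum(wi * np.outer(yi - my, yi - my) for wi, yi in zip(w, y))
        out.append((w_c, my, Py))
    return out
```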


Manifold Optimization for Gaussian Mixture Models

We take a new look at parameter estimation for Gaussian Mixture Models (GMMs). In particular, we propose using Riemannian manifold optimization as a powerful counterpart to Expectation Maximization (EM). An out-of-the-box invocation of manifold optimization, however, fails spectacularly: it converges to the same solution but vastly slower. Driven by intuition from manifold convexity, we then pr...
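One ingredient behind gradient-based (as opposed to EM) fitting of GMMs is an unconstrained parameterization of the covariances. The sketch below writes the GMM negative log-likelihood with Cholesky-factor covariances so a generic gradient-based optimizer respects positive definiteness; it is a simplistic stand-in, not the reformulation proposed in this paper, and the function name and jitter value are assumptions.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def gmm_nll(X, log_w, means, chols):
    """Negative log-likelihood of a GMM whose covariances are given by Cholesky
    factors, so any unconstrained values of `chols` yield valid covariances."""
    log_w = log_w - logsumexp(log_w)                    # normalize mixture weights in log space
    comp = []
    for lw, m, L in zip(log_w, means, chols):
        cov = L @ L.T + 1e-8 * np.eye(L.shape[0])       # PSD by construction, small jitter
        comp.append(lw + multivariate_normal(m, cov).logpdf(X))
    return -np.sum(logsumexp(np.stack(comp, axis=0), axis=0))
```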


A Reliability Application of a Mixture of Inverse Gaussian Distributions

A mixture of inverse Gaussian distributions is examined as a model for the lifetime of components. The components differ in one of three ways: in their initial quality, rate of wear, or variability of wear. These three cases are well represented by the parameters of the inverse Gaussian model. The mechanistic interpretation of t...
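As a small illustration of the model class only, a two-component inverse Gaussian lifetime mixture density can be evaluated as below. The helper name and parameter values are made up, and scipy's invgauss shape convention is mapped onto the usual IG(mean mu, shape lambda) parameterization.

```python
import numpy as np
from scipy.stats import invgauss

def ig_mixture_pdf(t, weights, mus, lambdas):
    """Density of a mixture of inverse Gaussian lifetimes. IG(mean=mu, shape=lam)
    corresponds to scipy's invgauss(mu / lam, scale=lam)."""
    pdf = np.zeros_like(np.asarray(t, dtype=float))
    for w, mu, lam in zip(weights, mus, lambdas):
        pdf += w * invgauss.pdf(t, mu / lam, scale=lam)
    return pdf

# Illustrative evaluation on a grid of lifetimes.
t = np.linspace(0.01, 10.0, 200)
density = ig_mixture_pdf(t, weights=[0.6, 0.4], mus=[1.0, 3.0], lambdas=[2.0, 5.0])
```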


Optimization of Gaussian Mixture Model Parameters for Speaker Identification

The Gaussian mixture model (GMM) [1] has been widely used for modeling speakers. In speaker identification, one major problem is how to generate a set of GMMs for identification purposes based upon the training data. Due to the hill-climbing characteristic of the maximum likelihood (ML) method, any arbitrary estimate of the initial model parameters will usually lead to a sub-optimal model in practi...
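A standard, if blunt, mitigation for the initialization sensitivity described above is to fit the speaker GMM from several seeded restarts and keep the best-likelihood solution. The sketch below does that with scikit-learn; it is not the optimization strategy of this paper, and the feature matrix and all hyperparameters are placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
features = rng.standard_normal((2000, 13))   # placeholder for per-frame MFCC features

gmm = GaussianMixture(
    n_components=16,         # illustrative model order for a per-speaker GMM
    covariance_type="diag",  # diagonal covariances are common in speaker ID
    n_init=5,                # restarts; the best log-likelihood fit is retained
    init_params="kmeans",
    random_state=0,
).fit(features)

score = gmm.score(features)  # average per-frame log-likelihood of the fitted model
```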


On the Entropy Computation of Large Gaussian Mixture Distributions

The entropy computation of Gaussian mixture distributions with a large number of components has a prohibitive computational complexity. In this paper, we propose a novel approach exploiting the sphere decoding concept to bound and approximate such entropy terms with reduced complexity and good accuracy. Moreover, we propose an SNR region-based enhancement of the approximation method to reduce th...
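For reference, the brute-force baseline such approximations are usually compared against is a Monte Carlo estimate of the mixture's differential entropy, H = -E[log p(x)]. A minimal sketch follows; the helper name and sample size are assumptions, and this is not the sphere-decoding bound proposed in the paper.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def gmm_entropy_mc(weights, means, covs, n_samples=10000, rng=None):
    """Monte Carlo estimate of a GMM's differential entropy: sample from the mixture,
    evaluate its log-density at the samples, and average."""
    rng = np.random.default_rng() if rng is None else rng
    k = len(weights)
    idx = rng.choice(k, size=n_samples, p=weights)                 # component indices
    x = np.array([rng.multivariate_normal(means[i], covs[i]) for i in idx])
    log_p = logsumexp(
        np.stack([np.log(w) + multivariate_normal(m, C).logpdf(x)
                  for w, m, C in zip(weights, means, covs)], axis=0),
        axis=0)                                                    # log of the full mixture density
    return -log_p.mean()
```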



Journal

Journal Title: IEEE Transactions on Signal Processing

Year: 2023

ISSN: 1053-587X, 1941-0476

DOI: https://doi.org/10.1109/tsp.2023.3290354